Kernel Knockoffs Selection for Nonparametric Additive Models
Authors
Abstract
Thanks to its fine balance between model flexibility and interpretability, the nonparametric additive model has been widely used, and variable selection for this type of model has been frequently studied. However, none of the existing solutions can control the false discovery rate (FDR) unless the sample size tends to infinity. The knockoff framework is a recent proposal that can address this issue, but few knockoff solutions are directly applicable to nonparametric models. In this article, we propose a novel kernel knockoffs selection procedure for the nonparametric additive model. We integrate three key components: the knockoffs, subsampling for stability, and random feature mapping for nonparametric function approximation. We show that the proposed method is guaranteed to control the FDR for any sample size, and achieves a power that approaches one as the sample size tends to infinity. We demonstrate the efficacy of our method through intensive simulations and comparisons with alternative solutions. Our proposal thus makes useful contributions to the methodology of nonparametric variable selection, FDR-based inference, as well as knockoffs. Supplementary materials for this article are available online.
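The components named in the abstract can be illustrated with a simplified sketch. This is a hypothetical toy version, not the authors' implementation: it omits the subsampling stability step, uses independent Gaussian covariates (for which an independent copy is a valid model-X knockoff), and substitutes a ridge-regression group-norm importance statistic; all variable names and the simulated data are illustrative assumptions.

```python
import numpy as np

def knockoff_threshold(W, q):
    """Knockoff+ threshold: smallest t with estimated FDP (1 + #{W <= -t}) / #{W >= t} <= q."""
    for t in np.sort(np.abs(W[W != 0])):
        if (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t)) <= q:
            return t
    return float("inf")  # no threshold achieves the target FDR level; select nothing

def random_fourier_features(x, omegas, phases):
    """Random cosine features approximating a shift-invariant kernel, per variable."""
    return np.cos(np.outer(x, omegas) + phases)

rng = np.random.default_rng(0)
n, p, m, q = 400, 20, 10, 0.2
X = rng.standard_normal((n, p))    # independent N(0,1) covariates
Xk = rng.standard_normal((n, p))   # independent copies: valid knockoffs here
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + X[:, 2] + 0.5 * rng.standard_normal(n)

# Map each original and knockoff variable through the same m random features.
blocks = []
for j in range(p):
    om, ph = rng.standard_normal(m), rng.uniform(0, 2 * np.pi, m)
    blocks.append(random_fourier_features(X[:, j], om, ph))
    blocks.append(random_fourier_features(Xk[:, j], om, ph))
Phi = np.hstack(blocks)

# Ridge fit; importance W_j = group norm of variable j's coefficients
# minus the group norm of its knockoff's coefficients.
lam = 1.0
beta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]),
                       Phi.T @ (y - y.mean()))
B = beta.reshape(p, 2, m)
W = np.linalg.norm(B[:, 0, :], axis=1) - np.linalg.norm(B[:, 1, :], axis=1)

tau = knockoff_threshold(W, q)
selected = np.where(W >= tau)[0]
```

The knockoff+ threshold is what yields finite-sample FDR control: large negative statistics (knockoff beats original) estimate how many large positive statistics are false discoveries.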
Similar Resources
Variable Selection in Nonparametric Additive Models.
We consider a nonparametric additive model of a conditional mean function in which the number of variables and additive components may be larger than the sample size but the number of nonzero additive components is "small" relative to the sample size. The statistical problem is to determine which additive components are nonzero. The additive components are approximated by truncated series expan...
Nonparametric Inferences for Additive Models
Additive models with backfitting algorithms are popular multivariate nonparametric fitting techniques. However, the inferences of the models have not been very well developed, due partially to the complexity of the backfitting estimators. There are few tools available to answer some important and frequently asked questions, such as whether a specific additive component is significant or admits ...
Estimation and Variable Selection in Additive Nonparametric Regression Models
Additive regression models have been shown to be useful in many situations. Numerical estimation of these models is usually done using the backfitting technique. This iterative numerical procedure converges very fast but has the disadvantage of a complicated 'hat matrix.' This paper proposes an estimator with an explicit 'hat matrix' which does not use backfitting. The asymptotic normality of the e...
Nonparametric Bayesian Kernel Models
Kernel models for classification and regression have emerged as widely applied tools in statistics and machine learning. We discuss a Bayesian framework and theory for kernel methods, providing a new rationalization of kernel regression based on nonparametric Bayesian models. Functional analytic results ensure that such a nonparametric prior specification induces a class of functions that span ...
Bandwidth selection for nonparametric kernel testing
We propose a sound approach to bandwidth selection in nonparametric kernel testing. The main idea is to find an Edgeworth expansion of the asymptotic distribution of the test concerned. Due to the involvement of a kernel bandwidth in the leading term of the Edgeworth expansion, we are able to establish closed–form expressions to explicitly represent the leading terms of both the size and power ...
Journal
Journal Title: Journal of the American Statistical Association
Year: 2022
ISSN: 0162-1459, 1537-274X, 2326-6228, 1522-5445
DOI: https://doi.org/10.1080/01621459.2022.2039671